Neural Processing of Complex Continual Input

Author

  • Felix A. Gers
Abstract

Long Short-Term Memory (LSTM) can learn algorithms for temporal pattern processing not learnable by alternative recurrent neural networks (RNNs) or by other methods such as Hidden Markov Models (HMMs) and symbolic grammar learning (SGL). Here we present tasks involving arithmetic operations on continual input streams that even LSTM cannot solve. But an LSTM variant based on "forget gates," a recent extension, has superior arithmetic capabilities and does solve the tasks.
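The forget-gate extension mentioned above lets the cell state decay or reset itself instead of growing without bound on continual streams. A minimal NumPy sketch of one such cell step (illustrative names and stacked-parameter layout are assumptions, not the paper's notation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of an LSTM cell with a forget gate.

    W, U, b hold stacked parameters for the input (i), forget (f),
    and output (o) gates plus the candidate update (g)."""
    n = h_prev.size
    z = W @ x + U @ h_prev + b       # stacked pre-activations, shape (4*n,)
    i = sigmoid(z[0*n:1*n])          # input gate
    f = sigmoid(z[1*n:2*n])          # forget gate: lets the cell reset itself
    o = sigmoid(z[2*n:3*n])          # output gate
    g = np.tanh(z[3*n:4*n])          # candidate cell update
    c = f * c_prev + i * g           # forget gate scales the old cell state
    h = o * np.tanh(c)               # gated output
    return h, c

# Toy usage: one hidden unit, one input, random weights.
rng = np.random.default_rng(0)
n, m = 1, 1
W = rng.standard_normal((4 * n, m))
U = rng.standard_normal((4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for x in ([1.0], [0.5], [0.0]):
    h, c = lstm_step(np.array(x), h, c, W, U, b)
```

The key line is `c = f * c_prev + i * g`: with `f` near zero the cell discards its state, which is what allows processing of unsegmented streams without external resets.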


Related articles

Prediction of breeding values for the milk production trait in Iranian Holstein cows applying artificial neural networks

Artificial neural networks, learning algorithms and mathematical models that mimic the information-processing ability of the human brain, can be applied to non-linear and complex data. The aim of this study was to predict the breeding values for the milk production trait in Iranian Holstein cows using artificial neural networks. Data on 35167 Iranian Holstein cows recorded between 1998 and 2009 were ...


Enhancing Efficiency of Neural Network Model in Prediction of Firms Financial Crisis Using Input Space Dimension Reduction Techniques

The main focus of this study is on data pre-processing, specifically reducing the number of inputs (input-space dimension reduction), with the aim of generalizing the data set into smaller dimensions without losing the most significant information. When the input space is large, the most important input variables can be identified so that insignificant variables are eliminated, or a variable ...
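One common form of the input-space reduction described above is projection onto principal components before training the network. A minimal sketch (plain NumPy; the function name and the choice of PCA are illustrative assumptions, not this paper's specific technique):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)                        # centre each input variable
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # shape (n_samples, k)

# Toy usage: shrink a 10-dimensional input space to 3 dimensions.
X = np.random.default_rng(1).standard_normal((100, 10))
Z = pca_reduce(X, 3)
```

The reduced matrix `Z` would then replace `X` as the network's input, keeping the directions of greatest variance while discarding the rest.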


Active Long Term Memory Networks

Continual learning in artificial neural networks suffers from interference and forgetting when different tasks are learned sequentially. This paper introduces Active Long Term Memory Networks (A-LTM), a model of sequential multitask deep learning that is able to maintain previously learned associations between sensory input and behavioral output while acquiring new knowledge. A-LTM exploits...


Learning to Forget: Continual Prediction with LSTM

Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the ...



Publication year: 2000